A Public Record at Risk: The Dire State of News Archiving in the Digital Age
This research report explores archiving practices and policies across newspapers, magazines, wire services, and digital-only news producers, with the aim of identifying the current state of archiving and potential strategies for preserving content in an age of digital distribution. Between March 2018 and January 2019, we conducted interviews with 48 individuals from 30 news organizations and preservation initiatives. We found that the majority of news outlets had given no thought to even basic strategies for preserving their digital content, and not one was properly saving a holistic record of what it produces. Of the 21 news organizations in our study, 19 were not taking any protective steps at all to archive their web output. The remaining two lacked formal strategies to ensure that their current practices have the longevity to outlast changes in technology.
Meanwhile, interviewees frequently (and mistakenly) equated digital backup and storage in Google Docs or content management systems with archiving. (They are not the same: backup means making copies for data recovery in case of damage or loss, while archiving means long-term preservation, ensuring that records remain available even as formatting and distribution technologies change.) Instead, news organizations have handed over their responsibilities as public stewards to third-party organizations such as the Internet Archive, Google, Ancestry, and ProQuest, which store and distribute copies of news content on remote servers. The news cycle thus now depends on proprietary organizations with increasing control over the public record. The Internet Archive aside, the larger issue is that these companies' incentives are neither journalistic nor archival, and may conflict with both. While a number of news archiving initiatives are being developed by both individuals and nonprofits, preserving digital content is not, first and foremost, a technical challenge. Rather, it is a test of human decision-making and a matter of priority. The first step in any archival process is the intention to save content. News organizations must get there.
The findings of this study should be a wake-up call to an industry fond of claiming that democracy cannot be sustained without journalism, one that anchors its legitimacy on being a truth and accountability watchdog. In an era when journalism is already under attack, managing its record and its future is as important as ever. Local, independent, and alternative news sources are especially at risk of not being preserved, threatening to leave critical exclusions in a record that will favor dominant versions of public history. As the sudden Gawker shutdown demonstrated in 2016, without archiving practices in place, content can be confiscated and disappear instantly.
Public Record Under Threat: News and the Archive in the Age of Digital Distribution
The evolution of the internet has created a vast storehouse of information, both current and historical, and all at the fingertips of the general public as well as journalists. But as we find ourselves apparently saturated by information and overwhelmed by its sources, we face a potential crisis of preservation as we seek—and often fail—to archive all manner of digital content.
News organizations have been slow to recognize and respond to the preservation challenges presented by digital technology. As a result, newsroom discussions about preservation and archiving are few and far between. However, professionals such as librarians, archivists, and technologists outside of the news industry are having these conversations, about retaining both conventional news content and online news, as well as about the problems inherent in trying to preserve the multitude of digital and data projects.
This shift from paper (or film) to digital record prompted the title of the conference that is the focus of this report: “Public Record Under Threat: News and the Archive in the Age of Digital Distribution.” The conference, the fourth and last in this phase of the Tow Center for Digital Journalism’s Platforms & Publishers series, arose from a recognition of the problems this shift has created in the archive.
While archivists and librarians generally agree that preservation of news content is an acute concern that must be addressed quickly, platforms and publishers often express doubt about whether they have anything to contribute to the discussion. Getting this crowd together in the same room—archivists, journalists, and technologists—is a first step toward defining the problem, and subsequently mapping out solutions.
Use of dietary linoleic acid for secondary prevention of coronary heart disease and death: evaluation of recovered data from the Sydney Diet Heart Study and updated meta-analysis
Objective: To evaluate the effectiveness of replacing dietary saturated fat with omega 6 linoleic acid for the secondary prevention of coronary heart disease and death.
Design: Evaluation of recovered data from the Sydney Diet Heart Study, a single blinded, parallel group, randomized controlled trial conducted in 1966-73, and an updated meta-analysis including these previously missing data.
Setting: Ambulatory, coronary care clinic in Sydney, Australia.
Participants: 458 men aged 30-59 years with a recent coronary event.
Interventions: Replacement of dietary saturated fats (from animal fats, common margarines, and shortenings) with omega 6 linoleic acid (from safflower oil and safflower oil polyunsaturated margarine). Controls received no specific dietary instruction or study foods. All non-dietary aspects were designed to be equivalent in both groups.
Outcome measures: All cause mortality (primary outcome); cardiovascular mortality and mortality from coronary heart disease (secondary outcomes). We used an intention to treat, survival analysis approach to compare mortality outcomes by group.
Results: The intervention group (n=221) had higher rates of death than controls (n=237) (all cause 17.6% v 11.8%, hazard ratio 1.62 (95% confidence interval 1.00 to 2.64), P=0.05; cardiovascular disease 17.2% v 11.0%, 1.70 (1.03 to 2.80), P=0.04; coronary heart disease 16.3% v 10.1%, 1.74 (1.04 to 2.92), P=0.04). Inclusion of these recovered data in an updated meta-analysis of linoleic acid intervention trials showed non-significant trends toward increased risks of death from coronary heart disease (hazard ratio 1.33 (0.99 to 1.79); P=0.06) and cardiovascular disease (1.27 (0.98 to 1.65); P=0.07).
Conclusions: Advice to substitute polyunsaturated fats for saturated fats is a key component of worldwide dietary guidelines for coronary heart disease risk reduction. However, clinical benefits of the most abundant polyunsaturated fatty acid, omega 6 linoleic acid, have not been established. In this cohort, substituting dietary linoleic acid in place of saturated fats increased the rates of death from all causes, coronary heart disease, and cardiovascular disease. An updated meta-analysis of linoleic acid intervention trials showed no evidence of cardiovascular benefit. These findings could have important implications for worldwide dietary advice to substitute omega 6 linoleic acid, or polyunsaturated fats in general, for saturated fats.
Trial registration: ClinicalTrials.gov NCT01621087.
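The survival analysis approach named in the abstract can be illustrated with a minimal Kaplan-Meier estimator, the standard nonparametric method for comparing mortality curves between groups. This is a generic sketch of the technique, not the study authors' code, and the toy data below are synthetic, not the trial data:

```python
# Minimal Kaplan-Meier survival estimator (illustrative sketch only;
# the toy observations below are synthetic, not the Sydney trial data).

def kaplan_meier(observations):
    """observations: list of (time, event) pairs, where event=1 is a death
    and event=0 is a censored follow-up. Returns [(time, S(t))] at each
    time where at least one death occurs."""
    obs = sorted(observations)
    n_at_risk = len(obs)
    survival = 1.0
    curve = []
    i = 0
    while i < len(obs):
        t = obs[i][0]
        deaths = 0
        removed = 0
        # Group all subjects whose observation ends at time t.
        while i < len(obs) and obs[i][0] == t:
            deaths += obs[i][1]
            removed += 1
            i += 1
        if deaths:
            # Product-limit update: S(t) shrinks by the fraction dying at t.
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed  # deaths and censorings both leave the risk set
    return curve

# Synthetic example: 5 subjects, deaths at t=2 and t=4, censoring at t=3, 5, 6.
toy = [(2, 1), (3, 0), (4, 1), (5, 0), (6, 0)]
print(kaplan_meier(toy))
```

In a two-group comparison like the trial's, one would compute a curve per arm; the hazard ratios reported above come from a Cox regression, which this sketch does not attempt.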
Dietary omega-6 fatty acid lowering increases bioavailability of omega-3 polyunsaturated fatty acids in human plasma lipid pools
Dietary linoleic acid (LA, 18:2n-6) lowering in rats reduces n-6 polyunsaturated fatty acid (PUFA) plasma concentrations and increases n-3 PUFA (eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA)) concentrations.
Re-evaluation of the traditional diet-heart hypothesis: analysis of recovered data from Minnesota Coronary Experiment (1968-73)
OBJECTIVE: To examine the traditional diet-heart hypothesis through recovery and analysis of previously unpublished data from the Minnesota Coronary Experiment (MCE) and to put findings in the context of existing diet-heart randomized controlled trials through a systematic review and meta-analysis.
DESIGN: The MCE (1968-73) was a double blind randomized controlled trial designed to test whether replacement of saturated fat with vegetable oil rich in linoleic acid reduces coronary heart disease and death by lowering serum cholesterol. Recovered MCE unpublished documents and raw data were analyzed according to hypotheses prespecified by the original investigators. Further, a systematic review and meta-analysis of randomized controlled trials that lowered serum cholesterol by providing vegetable oil rich in linoleic acid in place of saturated fat, without confounding by concomitant interventions, was conducted.
SETTING: One nursing home and six state mental hospitals in Minnesota, United States.
PARTICIPANTS: Unpublished documents with completed analyses for the randomized cohort of 9423 women and men aged 20-97; longitudinal data on serum cholesterol for the 2355 participants exposed to the study diets for a year or more; 149 completed autopsy files.
INTERVENTIONS: Serum cholesterol lowering diet that replaced saturated fat with linoleic acid (from corn oil and corn oil polyunsaturated margarine). Control diet was high in saturated fat from animal fats, common margarines, and shortenings.
MAIN OUTCOME MEASURES: Death from all causes; association between changes in serum cholesterol and death; and coronary atherosclerosis and myocardial infarcts detected at autopsy.
RESULTS: The intervention group had significant reduction in serum cholesterol compared with controls (mean change from baseline -13.8% v -1.0%; P<0.001). Kaplan Meier graphs showed no mortality benefit for the intervention group in the full randomized cohort or for any prespecified subgroup. There was a 22% higher risk of death for each 30 mg/dL (0.78 mmol/L) reduction in serum cholesterol in covariate adjusted Cox regression models (hazard ratio 1.22, 95% confidence interval 1.14 to 1.32; P<0.001). There was no evidence of benefit in the intervention group for coronary atherosclerosis or myocardial infarcts. Systematic review identified five randomized controlled trials for inclusion (n=10,808). In meta-analyses, these cholesterol lowering interventions showed no evidence of benefit on mortality from coronary heart disease (1.13, 0.83 to 1.54) or all cause mortality (1.07, 0.90 to 1.27).
CONCLUSIONS: Available evidence from randomized controlled trials shows that replacement of saturated fat in the diet with linoleic acid effectively lowers serum cholesterol but does not support the hypothesis that this translates to a lower risk of death from coronary heart disease or all causes. Findings from the Minnesota Coronary Experiment add to growing evidence that incomplete publication has contributed to overestimation of the benefits of replacing saturated fat with vegetable oils rich in linoleic acid.
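The meta-analytic pooling of hazard ratios described in both recovered-data abstracts is, in its standard form, inverse-variance averaging of log hazard ratios. The sketch below shows the generic fixed-effect version of that calculation; it is an illustration of the textbook method, not necessarily the authors' exact model, and apart from the first tuple (the Sydney all cause result quoted above) the input studies are synthetic placeholders:

```python
# Fixed-effect (inverse-variance) pooling of hazard ratios with 95% CIs.
# Generic textbook method, sketched for illustration only.
import math

def pool_hazard_ratios(studies):
    """studies: list of (hr, ci_low, ci_high) tuples with 95% CIs.
    Returns (pooled_hr, pooled_ci_low, pooled_ci_high)."""
    weights, weighted_logs = [], []
    for hr, lo, hi in studies:
        log_hr = math.log(hr)
        # Recover the standard error of log(HR) from the 95% CI width.
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1 / se ** 2  # inverse-variance weight
        weights.append(w)
        weighted_logs.append(w * log_hr)
    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se))

# First tuple is the Sydney all cause result quoted in the abstract above;
# the other two studies are synthetic placeholders.
toy_studies = [(1.62, 1.00, 2.64), (1.10, 0.80, 1.51), (0.95, 0.70, 1.29)]
print(pool_hazard_ratios(toy_studies))
```

Pooling on the log scale is what lets a trial with a wide confidence interval (a large standard error) receive proportionally less weight than a more precise one.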
Lowering dietary linoleic acid reduces bioactive oxidized linoleic acid metabolites in humans
Linoleic acid (LA) is the most abundant polyunsaturated fatty acid in human diets, a major component of human tissues, and the direct precursor to the bioactive oxidized LA metabolites (OXLAMs), 9- and 13-hydroxy-octadecadienoic acid (9- and 13-HODE) and 9- and 13-oxo-octadecadienoic acid (9- and 13-oxoODE). These four OXLAMs have been mechanistically linked to pathological conditions ranging from cardiovascular disease to chronic pain. Plasma OXLAMs, which are elevated in Alzheimer’s dementia and non-alcoholic steatohepatitis, have been proposed as biomarkers useful for indicating the presence and severity of both conditions. Because mammals lack the enzymatic machinery needed for de novo LA synthesis, the abundance of LA and OXLAMs in mammalian tissues may be modifiable via diet. To examine this issue in humans, we measured circulating LA and OXLAMs before and after a 12-week LA lowering dietary intervention in chronic headache patients. Lowering dietary LA significantly reduced the abundance of plasma OXLAMs, and reduced the LA content of multiple circulating lipid fractions that may serve as precursor pools for endogenous OXLAM synthesis. These results show that lowering dietary LA can reduce the synthesis and/or accumulation of oxidized LA derivatives that have been implicated in a variety of pathological conditions. Future studies evaluating the clinical implications of diet-induced OXLAM reductions are warranted.
Health and environmental applications of gut microbiome: A review
Life on Earth harbours an unimaginable diversity of microbial communities. Among these, the gut microbiome, the ecological community of commensal and symbiotic microbes (bacteria and bacteriophages), is a unique assemblage. This microbial population of the animal gut helps the organism carry out its physiological processes and stay healthy and fit. The role of these microbial communities is immense. They continually maintain a subtle equilibrium with the intestinal mucosa and support gut functions ranging from metabolism to immunity: upgrading nutrient-poor diets, aiding digestion of recalcitrant food components, protecting against pathogens, contributing to inter- and intra-specific communication, and affecting hosts' efficiency as disease vectors. Microbial diversity in the gut depends on environmental competition between microbes, their sieving effects, and subsequent elimination. Because of the wide diversity of digestive-tract anatomy and physiology and of food habits, the gut microbiome also differs broadly among animals. Stochastic factors in the history of microbiome colonization in a species, together with in situ evolution, are likely to establish interspecies diversity. Moreover, these microbes offer enormous opportunities to discover novel species for therapeutic and/or biotechnological applications. In this manuscript, we review the available knowledge on the gut microbiome, emphasising its role in health and health-related applications in humans. © 2017 Soumya Chatterjee et al., published by De Gruyter Open 2017.
DELETION WILL BE MY EPITAPH: JOURNALISTS' DELETION PRACTICES ON TWITTER
This study seeks to understand journalists' deletion of their own tweets, arguing that these deletions uniquely reflect the contemporary fragility of archiving and journalism – two human enterprises central to societies’ ability to reflexively consider the past and present and democratically chart a future course. From the perspective of journalism as a profession, it argues that the study of tweet deletion is a means of examining the constraints under which journalists operate today, including the occupational precarity and polarized public sphere with which they contend. From the web archival perspective, this study methodologically informs scholars who rely on public tweets as a source for their research, as well as other social actors – such as NGOs, activists, politicians, and cultural producers – who rely on social media as a web archive in their public activities. It proposes to identify some of the voices that will be removed from this public square as it becomes a public archive, and highlights the proactive ephemerality of journalistic social media content. Based on interviews conducted in the winter of 2019 with 17 journalists working in New York City, the study examines how journalists perceive the action of deleting their tweets and how they justify it.
Standards rule? Regulations, literacies and algorithms in times of transition
In this panel we seek to reflect upon the theme "internet rules" by drawing on the notion of standards, developed in Science and Technology Studies. The work of Susan Leigh Star lays a foundation for considering the relationships between rules, standards and algorithms as forms of infrastructure. In the panel, we explore the production of standards as they become transparent infrastructures, heeding Star and Lampland's call to restore these standards' "historical development, their political consequences, and the smoke-filled rooms always attached to decisions made about them" (2009:13). Standards – and algorithms – are rarely queried, as they promise and embody efficiency and order. Indeed, modernity may be described as a concentrated, relentless effort to contain the accidental, the arbitrary, the residual; to categorize, order, and routinize the unexpected; and to preclude the exceptional and unpredictable (Bauman, 1991) – in a word: to standardize. As Larkin writes, it is difficult to separate an analysis of infrastructures such as standards from the modernist belief that by promoting order, "infrastructures bring about change, and through change they enact progress, and through progress we gain freedom" (2013:332). It is ironic, then, that standards are distributed unevenly across the sociocultural landscape, that they are increasingly linked and integrated with one another, and that they codify, embody or prescribe social values that often carry great consequences for individuals and groups (Star and Lampland, 2009:5). In this context, the four papers and the moderator of this panel explore the meaning of contemporary standardization practices in such diverse fields as memory applications, crowdfunding, biometric identification and national archiving, and internet literacy – viewing them as empirically distinct yet theoretically interrelated attempts to impose order in times of growing uncertainty.
Together, they address two tensions that inform contemporary standardization efforts: standards as an encounter between analogue and digital objects and practices, and standards as a dialectic of invisibility and transparency, a pragmatic and symbolic endeavor.